Non-autoregressive method for Uyghur-Chinese neural machine translation
ZHU Xiangrong, WANG Lei, YANG Yating, DONG Rui, ZHANG Jun
Journal of Computer Applications, 2020, 40 (7): 1891-1895.
DOI: 10.11772/j.issn.1001-9081.2019111974
Although existing autoregressive translation models based on recurrent neural networks, convolutional neural networks, or the Transformer achieve good translation quality, their low decoding parallelism makes translation slow. Therefore, a learning rate optimization strategy based on a non-autoregressive model was proposed. Building on the non-autoregressive sequence model with iterative refinement, the learning rate adjustment method was changed: warm-up was replaced with linear annealing. Firstly, linear annealing was evaluated and shown to outperform warm-up; then linear annealing was applied to the non-autoregressive sequence model to obtain the best balance between translation quality and decoding speed; finally, the method was compared with the autoregressive model. Experimental results show that, compared with the autoregressive Transformer, the proposed method increases decoding speed by a factor of 2.74 while achieving a BiLingual Evaluation Understudy (BLEU) score of 41.31, which is 95.34% of the Transformer's. Thus, the non-autoregressive sequence model with linear annealing effectively improves decoding speed at only a small cost in translation quality, making it suitable for platforms with an urgent need for translation speed.
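The abstract contrasts the usual Transformer warm-up schedule with the linear annealing schedule it adopts. A minimal sketch of the two schedules follows; the function names and all hyperparameter values (model dimension, warm-up steps, maximum/minimum learning rates, total steps) are illustrative assumptions, not values reported in the paper:

```python
def warmup_schedule(step, d_model=512, warmup_steps=4000):
    """Transformer-style warm-up (illustrative): the learning rate rises
    linearly for `warmup_steps` steps, then decays proportionally to
    step ** -0.5."""
    return d_model ** -0.5 * min(step ** -0.5, step * warmup_steps ** -1.5)

def linear_annealing_schedule(step, lr_max=5e-4, lr_min=1e-5, total_steps=100000):
    """Linear annealing (illustrative): the learning rate starts at lr_max
    and decreases linearly to lr_min over `total_steps` steps, with no
    warm-up phase."""
    frac = min(step / total_steps, 1.0)
    return lr_max + frac * (lr_min - lr_max)
```

Under these assumed settings, warm-up peaks at step 4000 and then decays, while linear annealing is highest at step 0 and decreases monotonically, which is the behavioral difference the paper exploits.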